Kernel discriminant analysis for regression problems

Author

  • Nojun Kwak
Abstract

In this paper, we propose a nonlinear feature extraction method for regression problems that reduces the dimensionality of the input space. Previously, a feature extraction method called LDAr, a regression version of linear discriminant analysis, was proposed. In this paper, LDAr is generalized to a nonlinear discriminant analysis by using the so-called kernel trick. The basic idea is to map the input space into a high-dimensional feature space in which the variables are nonlinear transformations of the input variables. We then maximize the ratio of the distances between samples with large differences in the target value to the distances between samples with small differences in the target value in the feature space. It is well known that the distribution of face images, under perceivable variations in translation, rotation, and scaling, is highly nonlinear, and that face alignment is a complex regression problem. We have applied the proposed method to various regression problems, including face alignment, and achieved better performance than conventional linear feature extraction methods.
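The criterion described above — contrasting sample pairs with large target differences against pairs with small ones — can be sketched in its linear form. The following is a minimal illustration, not the authors' implementation: the pair-difference threshold `eps`, the regularization constant, and the function name `ldar_directions` are all assumptions made for the example.

```python
import numpy as np
from scipy.linalg import eigh

def ldar_directions(X, y, eps=0.5, n_components=1):
    """Sketch of an LDAr-style criterion (linear version, for illustration).

    Pairs whose target values differ by more than `eps` contribute to a
    "between-like" scatter matrix; the remaining pairs contribute to a
    "within-like" scatter matrix. Directions maximizing the ratio of the
    two are found via a generalized eigenproblem, as in classical LDA.
    """
    n, d = X.shape
    Sb = np.zeros((d, d))  # scatter of pairs with large target difference
    Sw = np.zeros((d, d))  # scatter of pairs with small target difference
    for i in range(n):
        for j in range(i + 1, n):
            diff = (X[i] - X[j])[:, None]
            if abs(y[i] - y[j]) > eps:
                Sb += diff @ diff.T
            else:
                Sw += diff @ diff.T
    Sw += 1e-6 * np.eye(d)  # regularize to keep Sw positive definite
    vals, vecs = eigh(Sb, Sw)  # eigenvalues in ascending order
    return vecs[:, ::-1][:, :n_components]  # top-ratio directions

# Usage: project a toy regression data set onto one discriminant direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = X @ np.array([1.0, -2.0, 0.0, 0.0]) + 0.1 * rng.normal(size=60)
W = ldar_directions(X, y, eps=1.0)
Z = X @ W
print(Z.shape)
```

The kernelized version in the paper replaces the explicit pair differences with inner products in the feature space, so the eigenproblem is posed over expansion coefficients rather than input-space directions.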


Similar articles

Robust Kernel Fisher Discriminant Analysis

Kernel methods have become standard tools for solving classification and regression problems in statistics. An example of a kernel based classification method is Kernel Fisher discriminant analysis (KFDA), a kernel based extension of linear discriminant analysis (LDA), which was proposed by Mika et al. (1999). As in the case of LDA, the classification performance of KFDA deteriorates in the pre...

Regularized Discriminant Analysis, Ridge Regression and Beyond

Fisher linear discriminant analysis (FDA) and its kernel extension—kernel discriminant analysis (KDA)—are well known methods that consider dimensionality reduction and classification jointly. While widely deployed in practical problems, there are still unresolved issues surrounding their efficient implementation and their relationship with least mean squares procedures. In this paper we address...

Neural Class-Specific Regression for face verification

Face verification is a problem approached in the literature mainly using nonlinear class-specific subspace learning techniques. While it has been shown that kernel-based Class-Specific Discriminant Analysis is able to provide excellent performance in small- and medium-scale face verification problems, its application in today’s large-scale problems is difficult due to its training space and comput...

Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework

Recently, with the development of science and technology, data of a functional nature have become easy to collect, so the statistical analysis of such data is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional analysis, and the theory of Reproducing Kernel Hilbert Spaces is very important in this context. In this paper we study a gen...

Regression Optimized Kernel for High-level Speaker Verification

Computing the likelihood-ratio (LR) score of a test utterance is an important step in speaker verification. It has recently been shown that for discrete speaker models, the LR scores can be expressed as dot products between supervectors formed by the test utterance, target-speaker model, and background model. This paper leverages this dot-product formulation and the representer theorem to deriv...


Journal:
  • Pattern Recognition

Volume 45, Issue 

Pages  -

Year of publication: 2012